Powerset Convolutional Neural Networks

Chris Wendler, Markus Püschel, Dan Alistarh

Neural Information Processing Systems

However, an artifact of many of these models is that regularity priors are hidden in their fundamental neural building blocks, which makes it impossible to apply them directly to irregular data domains.


Set functions are data indexed by the powerset of a ground set of size n. Note that set functions are inherently 2^n-dimensional objects.
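To make the 2^n-dimensional view concrete: a set function on a ground set of size n assigns one value to each of the 2^n subsets and can be stored as a length-2^n vector indexed by bitmasks. The cut function below is a toy example chosen for illustration (the graph and encoding are not from the paper):

```python
# Toy example: the cut function of a small graph, stored as a
# length-2**n vector indexed by bitmask subsets of the vertex set.
n = 3
edges = [(0, 1), (1, 2), (0, 2)]  # triangle graph (illustrative choice)

def cut(A):
    """Number of edges with exactly one endpoint in the subset A (a bitmask)."""
    return sum(((A >> u) & 1) != ((A >> v) & 1) for u, v in edges)

# The set function as a 2**n-dimensional vector: one entry per subset.
s = [cut(A) for A in range(2 ** n)]
```

The empty set and the full vertex set both have cut value 0, while singletons cut both of their incident edges.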


Author response: We thank the reviewers for their comments. We will incorporate all points and suggested clarifications. Convolution is done efficiently in the Fourier domain; hence, the models are in the same complexity class. Q: What is the norm on s : 2^N → R? Q: What is the difference between the two proposed models (l.232)?
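One way to see why Fourier-domain processing of set functions stays in the O(n·2^n) complexity class mentioned in the reviews: the zeta transform (summing a set function over all subsets), a standard building block of set-function transforms, admits an O(n·2^n) butterfly-style algorithm. The sketch below shows that generic fast-transform pattern; it is illustrative and not the paper's exact Fourier transform:

```python
def zeta_transform(s):
    """Fast zeta transform: t[A] = sum of s[B] over all subsets B of A.
    For a length-2**n input this runs in O(n * 2**n) rather than O(4**n)."""
    t = list(s)
    n = len(s).bit_length() - 1
    for i in range(n):          # one butterfly stage per ground-set element
        for A in range(len(t)):
            if A & (1 << i):    # fold in the value with element i removed
                t[A] += t[A ^ (1 << i)]
    return t

# Example: the indicator of the empty set transforms to the all-ones vector,
# since the empty set is a subset of every A.
print(zeta_transform([1, 0, 0, 0, 0, 0, 0, 0]))
```

Each of the n stages touches all 2^n entries once, which is where the O(n·2^n) bound comes from.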




Reviews: Powerset Convolutional Neural Networks


The authors built their work on top of "A discrete signal processing framework for set functions", where powerset convolutions were defined, adding powerset pooling operations and defining powerset convolutional neural networks that can be used to classify set functions. The authors provided a detailed analysis, from a pattern-matching perspective, of the kinds of patterns that powerset convolutions are sensitive to, and described their implementation. The authors acknowledge the exponential complexity O(n·2^n) and the difficulty of scaling their approach to larger ground sets, which limits the applicability of the current method. The empirical results show that the powerset CNNs perform similarly to the baselines on both the synthetic and real datasets; perhaps the chosen tasks are too small or ill-suited to showcase the proposed powerset CNNs. The authors recognize the lack of datasets containing set functions well suited for their method; however, the current set of experiments weakens the argument that powerset CNNs can handle set functions better than graph-convolutional baselines.


Powerset Convolutional Neural Networks

Wendler, Chris, Alistarh, Dan, Püschel, Markus

arXiv.org Machine Learning

We present a novel class of convolutional neural networks (CNNs) for set functions, i.e., data indexed with the powerset of a finite set. The convolutions are derived as linear, shift-equivariant functions for various notions of shifts on set functions. The framework is fundamentally different from graph convolutions based on the Laplacian, as it provides not one but several basic shifts, one for each element in the ground set. Prototypical experiments with several set function classification tasks on synthetic datasets and on datasets derived from real-world hypergraphs demonstrate the potential of our new powerset CNNs.
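To make the "one shift per ground-set element" idea concrete, here is a hedged sketch of one shift variant used in this line of work, (T_q s)(A) = s(A \ {q}), together with the convolution it induces, (h ∗ s)(A) = Σ_Q h(Q)·s(A \ Q); the paper's exact definitions and other shift variants may differ:

```python
def shift(s, q):
    """One powerset shift per ground-set element q: (T_q s)(A) = s(A \\ {q})."""
    return [s[A & ~(1 << q)] for A in range(len(s))]

def powerset_conv(h, s):
    """Convolution induced by these shifts: (h * s)(A) = sum_Q h[Q] * s(A \\ Q)."""
    N = len(s)
    return [sum(h[Q] * s[A & ~Q] for Q in range(N)) for A in range(N)]

# Shift-equivariance check: filtering a shifted signal equals
# shifting the filtered signal, for every element q.
h = [1, 2, 0, 1]   # filter on the powerset of the ground set {0, 1}
s = [3, 1, 4, 1]   # set function on the same powerset
q = 0
lhs = powerset_conv(h, shift(s, q))
rhs = shift(powerset_conv(h, s), q)
assert lhs == rhs
```

The equivariance holds because (A \ Q) \ {q} = (A \ {q}) \ Q, which is exactly the linearity-plus-shift structure from which the convolutions in the abstract are derived.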